---
title: "Adaptive Multilayer Perceptron"
output:
flexdashboard::flex_dashboard:
storyboard: true
social: menu
source: embed
---
```{r setup, include=FALSE}
library(flexdashboard)
```
### Presentation.

### Back to version 1

### What is the objective again?

### Version 2.

### Change the number of neurons.

### Change the number of layers.

### Implementation challenges.

***
1. Add more layers at will
2. Generalization
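One way to address both challenges is to describe the network by a single vector of layer sizes, so adding a layer is just one more entry. This is a minimal sketch with a hypothetical helper (`make_mlp` is illustrative, not the project's actual code):

```{r make-mlp}
# Build a list of weight/bias pairs for an arbitrary number of layers.
# `sizes` gives the neurons per layer, e.g. c(4, 8, 8, 3) =
# input, two hidden layers, output.
make_mlp <- function(sizes) {
  lapply(seq_len(length(sizes) - 1), function(i) {
    list(
      W = matrix(rnorm(sizes[i] * sizes[i + 1], sd = 0.1),
                 nrow = sizes[i], ncol = sizes[i + 1]),
      b = rep(0, sizes[i + 1])
    )
  })
}

net <- make_mlp(c(4, 8, 8, 3))  # add more layers at will
length(net)                      # one weight/bias pair per connection
```

Generalization then falls out for free: forward and backward passes loop over `net` instead of hard-coding each layer.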
### Output Layer.

### Hidden Layer 2.

### Hidden Layer 1.

### Perceptron.

***
1. Cross-entropy loss:
- for each epoch, an error loss is recorded
2. Batch accuracy:
- for every 10 rows of data, the accuracy over that batch is computed
3. Convergence rate:
- every 10 epochs, the rate of convergence (error loss vs. number of epochs) is calculated
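The three metrics above can be sketched as plain R functions (the names and exact formulas are illustrative assumptions, not the dashboard's actual code):

```{r metrics}
# Cross-entropy loss: y_true is a one-hot matrix, y_prob the
# predicted probabilities; pmax() guards against log(0).
cross_entropy <- function(y_true, y_prob) {
  -mean(rowSums(y_true * log(pmax(y_prob, 1e-12))))
}

# Batch accuracy: mean accuracy over consecutive groups of `batch` rows.
batch_accuracy <- function(y_true, y_pred, batch = 10) {
  groups <- ceiling(seq_along(y_true) / batch)
  tapply(y_true == y_pred, groups, mean)
}

# Convergence rate: change in loss per epoch, sampled every `every` epochs.
convergence_rate <- function(loss, every = 10) {
  idx <- seq(every, length(loss), by = every)
  (loss[idx] - loss[idx - every + 1]) / (every - 1)
}
```

A perfectly calibrated prediction gives a cross-entropy of zero, and a steadily decreasing loss gives a negative convergence rate.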
### Demo

### Next Steps

### Search for the best configuration.

***
1. Hypothetical case
- ANN with 4 layers maximum (1 input, 2 hidden, 1 output)
- 3 variants for the number of neurons in each hidden layer
- 2 variants for the number of layers (1 or 2 hidden layers)
- 3 types of activation functions
- 243 possibilities with 4 layers
- 27 possibilities with 3 layers
- 270 possibilities in total
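One way to arrive at these counts (an assumption about how the variants combine: 3 neuron choices and 3 activation choices per hidden layer, plus 3 activation choices for the output layer) is to enumerate the configurations with `expand.grid`:

```{r config-count}
# 2 hidden layers: 3 x 3 neuron choices, 3 x 3 x 3 activation choices
two_hidden <- expand.grid(n1 = 1:3, n2 = 1:3,
                          act1 = 1:3, act2 = 1:3, act_out = 1:3)
# 1 hidden layer: 3 neuron choices, 3 x 3 activation choices
one_hidden <- expand.grid(n1 = 1:3, act1 = 1:3, act_out = 1:3)

nrow(two_hidden)                     # 243 possibilities with 4 layers
nrow(one_hidden)                     # 27 possibilities with 3 layers
nrow(two_hidden) + nrow(one_hidden)  # 270 in total
```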
### Finding the best architecture.

***
1. Exhaustive search is not a good idea
2. Intuitively decide the number of neurons, layers and activation functions
or maybe
3. Use a metaheuristic to explore the search space:
a genetic algorithm?
### Genetic Algorithm Search.

***
1. Each component is equivalent to a gene
2. Each configuration is an individual
3. A change of configuration corresponds to the crossover of two individuals or to a mutation
4. Genetic operations with MEMORY!
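The encoding above can be sketched as follows: a configuration is a vector of genes, crossover and mutation operate gene-by-gene, and the "memory" is a cache of already-evaluated configurations so no fitness is computed twice. Function names are illustrative, not the project's actual code:

```{r ga-sketch}
# Crossover: each gene inherited from one of the two parents at random.
crossover <- function(a, b) {
  pick <- runif(length(a)) < 0.5
  ifelse(pick, a, b)
}

# Mutation: each gene has a small chance of being resampled.
mutate <- function(genes, choices = 1:3, rate = 0.1) {
  hit <- runif(length(genes)) < rate
  genes[hit] <- sample(choices, sum(hit), replace = TRUE)
  genes
}

# Memory: an environment used as a fitness cache, keyed by configuration,
# so genetic operations never re-evaluate an individual already seen.
memory <- new.env()
fitness_cached <- function(genes, fitness_fn) {
  key <- paste(genes, collapse = "-")
  if (!exists(key, envir = memory, inherits = FALSE)) {
    assign(key, fitness_fn(genes), envir = memory)
  }
  get(key, envir = memory)
}
```

Since training a network dominates the cost of a fitness evaluation, the cache is what makes the genetic search affordable relative to exhaustive enumeration.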
### Questions?
